Evaluation of Poincare Embeddings

This notebook demonstrates how well Poincare embeddings perform on the tasks detailed in the original paper about the embeddings.

The following two external, open-source implementations are used -

  1. C++
  2. Numpy

This is the list of tasks -

  1. WordNet reconstruction
  2. WordNet link prediction
  3. Link prediction in collaboration networks (evaluation incomplete)
  4. Lexical entailment on HyperLex

A more detailed explanation of the tasks and the evaluation methodology is present in the individual evaluation subsections.

1. Setup

The following section performs the following -

  1. Imports required python libraries and downloads the wordnet data
  2. Clones the repositories containing the C++ and Numpy implementations of the Poincare embeddings
  3. Applies patches containing minor changes to the implementations.
  4. Compiles the C++ sources to create a binary

In [1]:
%cd ../..


/home/misha/git/gensim

In [2]:
# Some libraries need to be installed that are not part of Gensim
! pip install click>=6.7 nltk>=3.2.5 prettytable>=0.7.2 pygtrie>=2.2


You are using pip version 19.0.1, however version 19.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

In [3]:
import csv
from collections import OrderedDict
from IPython.display import display, HTML
import logging
import os
import pickle
import random
import re

import click
from gensim.models.poincare import PoincareModel, PoincareRelations, \
    ReconstructionEvaluation, LinkPredictionEvaluation, \
    LexicalEntailmentEvaluation, PoincareKeyedVectors
from gensim.utils import check_output
import nltk
from prettytable import PrettyTable
from smart_open import smart_open

logging.basicConfig(level=logging.INFO)
nltk.download('wordnet')


[nltk_data] Downloading package wordnet to /home/misha/nltk_data...
[nltk_data]   Package wordnet is already up-to-date!
Out[3]:
True

Please set the variable parent_directory below to change the directory to which the repositories are cloned.


In [4]:
%cd docs/notebooks/


/home/misha/git/gensim/docs/notebooks

In [5]:
current_directory = os.getcwd()

In [6]:
# Change this variable to `False` to not remove and re-download repos for external implementations
force_setup = False

# The poincare datasets, models and source code for external models are downloaded to this directory
parent_directory = os.path.join(current_directory, 'poincare')
! mkdir -p {parent_directory}

In [7]:
%cd {parent_directory}

# Clone repos
np_repo_name = 'poincare-np-embedding'
if force_setup and os.path.exists(np_repo_name):
    ! rm -rf {np_repo_name}
clone_np_repo = not os.path.exists(np_repo_name)
if clone_np_repo:
    ! git clone https://github.com/nishnik/poincare_embeddings.git {np_repo_name}

cpp_repo_name = 'poincare-cpp-embedding'
if force_setup and os.path.exists(cpp_repo_name):
    ! rm -rf {cpp_repo_name}
clone_cpp_repo = not os.path.exists(cpp_repo_name)
if clone_cpp_repo:
    ! git clone https://github.com/TatsuyaShirakawa/poincare-embedding.git {cpp_repo_name}

patches_applied = False


/home/misha/git/gensim/docs/notebooks/poincare

In [8]:
# Apply patches
if clone_cpp_repo and not patches_applied:
    %cd {cpp_repo_name}
    ! git apply ../poincare_burn_in_eps.patch

if clone_np_repo and not patches_applied:
    %cd ../{np_repo_name}
    ! git apply ../poincare_numpy.patch
    
patches_applied = True

In [9]:
# Compile the code for the external c++ implementation into a binary
%cd {parent_directory}/{cpp_repo_name}
!mkdir -p work
%cd work
!cmake ..
!make
%cd {current_directory}


/home/misha/git/gensim/docs/notebooks/poincare/poincare-cpp-embedding
/home/misha/git/gensim/docs/notebooks/poincare/poincare-cpp-embedding/work
-- The C compiler identification is GNU 7.4.0
-- The CXX compiler identification is GNU 7.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Check if compiler accepts -pthread
-- Check if compiler accepts -pthread - yes
-- Found Threads: TRUE  
-- Configuring done
-- Generating done
-- Build files have been written to: /home/misha/git/gensim/docs/notebooks/poincare/poincare-cpp-embedding/work
Scanning dependencies of target poincare_embedding
[ 50%] Building CXX object CMakeFiles/poincare_embedding.dir/src/poincare_embedding.cpp.o
[100%] Linking CXX executable poincare_embedding
[100%] Built target poincare_embedding
/home/misha/git/gensim/docs/notebooks

You might need to install an updated version of cmake to be able to compile the source code. Please make sure that the binary poincare_embedding has been created before proceeding by verifying the above cell does not raise an error.


In [ ]:
cpp_binary_path = os.path.join(parent_directory, cpp_repo_name, 'work', 'poincare_embedding')
assert(os.path.exists(cpp_binary_path)), 'Binary file doesnt exist at %s' % cpp_binary_path

2. Training

2.1 Create the data


In [ ]:
# These directories are auto created in the current directory for storing poincare datasets and models
data_directory = os.path.join(parent_directory, 'data')
models_directory = os.path.join(parent_directory, 'models')

# Create directories
! mkdir -p {data_directory}
! mkdir -p {models_directory}

In [ ]:
# Prepare the WordNet data
# Can also be downloaded directly from -
# https://github.com/jayantj/gensim/raw/wordnet_data/docs/notebooks/poincare/data/wordnet_noun_hypernyms.tsv

wordnet_file = os.path.join(data_directory, 'wordnet_noun_hypernyms.tsv')
if not os.path.exists(wordnet_file):
    ! python {parent_directory}/{cpp_repo_name}/scripts/create_wordnet_noun_hierarchy.py {wordnet_file}


82115 nouns
743241 hypernyms

In [ ]:
# Prepare the HyperLex data
hyperlex_url = "http://people.ds.cam.ac.uk/iv250/paper/hyperlex/hyperlex-data.zip"
! wget {hyperlex_url} -O {data_directory}/hyperlex-data.zip
if os.path.exists(os.path.join(data_directory, 'hyperlex')):
    ! rm -r {data_directory}/hyperlex
! unzip {data_directory}/hyperlex-data.zip -d {data_directory}/hyperlex/
hyperlex_file = os.path.join(data_directory, 'hyperlex', 'nouns-verbs', 'hyperlex-nouns.txt')


--2019-05-10 12:18:20--  http://people.ds.cam.ac.uk/iv250/paper/hyperlex/hyperlex-data.zip
Resolving people.ds.cam.ac.uk (people.ds.cam.ac.uk)... 131.111.3.47
Connecting to people.ds.cam.ac.uk (people.ds.cam.ac.uk)|131.111.3.47|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 183900 (180K) [application/zip]
Saving to: ‘/home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex-data.zip’

/home/misha/git/gen 100%[===================>] 179.59K   158KB/s    in 1.1s    

2019-05-10 12:18:22 (158 KB/s) - ‘/home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex-data.zip’ saved [183900/183900]

Archive:  /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex-data.zip
   creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/nouns-verbs/
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/nouns-verbs/hyperlex-verbs.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/nouns-verbs/hyperlex-nouns.txt  
   creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/
   creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/hyperlex_training_all_random.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/hyperlex_test_all_random.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/hyperlex_dev_all_random.txt  
   creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/hyperlex_dev_all_lexical.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/hyperlex_test_all_lexical.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/hyperlex_training_all_lexical.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/hyperlex-all.txt  
  inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/README.txt  

2.2 Training C++ embeddings


In [ ]:
def train_cpp_model(
    binary_path, data_file, output_file, dim, epochs, neg,
    num_threads, epsilon, burn_in, seed=0):
    """Train a poincare embedding using the c++ implementation
    
    Args:
        binary_path (str): Path to the compiled c++ implementation binary
        data_file (str): Path to tsv file containing relation pairs
        output_file (str): Path to output file containing model
        dim (int): Number of dimensions of the trained model
        epochs (int): Number of epochs to use
        neg (int): Number of negative samples to use
        num_threads (int): Number of threads to use for training the model
        epsilon (float): Constant used for clipping below a norm of one
        burn_in (int): Number of epochs to use for burn-in init (0 means no burn-in)
    
    Notes: 
        If `output_file` already exists, skips training
    """
    if os.path.exists(output_file):
        print('File %s exists, skipping' % output_file)
        return
    args = {
        'dim': dim,
        'max_epoch': epochs,
        'neg_size': neg,
        'num_thread': num_threads,
        'epsilon': epsilon,
        'burn_in': burn_in,
        'learning_rate_init': 0.1,
        'learning_rate_final': 0.0001,
    }
    cmd = [binary_path, data_file, output_file]
    for option, value in args.items():
        cmd.append("--%s" % option)
        cmd.append(str(value))
    
    return check_output(args=cmd)

In [ ]:
model_sizes = [5, 10, 20, 50, 100, 200]
default_params = {
    'neg': 20,
    'epochs': 50,
    'threads': 8,
    'eps': 1e-6,
    'burn_in': 0,
    'batch_size': 10,
    'reg': 0.0
}

non_default_params = {
    'neg': [10],
    'epochs': [200],
    'burn_in': [10]
}

In [ ]:
def cpp_model_name_from_params(params, prefix):
    param_keys = ['burn_in', 'epochs', 'neg', 'eps', 'threads']
    name = ['%s_%s' % (key, params[key]) for key in sorted(param_keys)]
    return '%s_%s' % (prefix, '_'.join(name))

def train_model_with_params(params, train_file, model_sizes, prefix, implementation):
    """Trains models with given params for multiple model sizes using the given implementation
    
    Args:
        params (dict): parameters to train the model with
        train_file (str): Path to tsv file containing relation pairs
        model_sizes (list): list of dimension sizes (integer) to train the model with
        prefix (str): prefix to use for the saved model filenames
        implementation (str): whether to use the numpy or c++ implementation,
                              allowed values: 'numpy', 'c++'
   
   Returns:
        tuple (model_name, model_files)
        model_files is a dict of (size, filename) pairs
        Example: ('cpp_model_epochs_50', {5: 'models/cpp_model_epochs_50_dim_5'})
    """
    files = {}
    if implementation == 'c++':
        model_name = cpp_model_name_from_params(params, prefix)
    elif implementation == 'numpy':
        model_name = np_model_name_from_params(params, prefix)
    elif implementation == 'gensim':
        model_name = gensim_model_name_from_params(params, prefix)
    else:
        raise ValueError('Given implementation %s not found' % implementation)
    for model_size in model_sizes:
        output_file_name = '%s_dim_%d' % (model_name, model_size)
        output_file = os.path.join(models_directory, output_file_name)
        print('Training model %s of size %d' % (model_name, model_size))
        if implementation == 'c++':
            out = train_cpp_model(
                cpp_binary_path, train_file, output_file, model_size,
                params['epochs'], params['neg'], params['threads'],
                params['eps'], params['burn_in'], seed=0)
        elif implementation == 'numpy':
            train_external_numpy_model(
                python_script_path, train_file, output_file, model_size,
                params['epochs'], params['neg'], seed=0)
        elif implementation == 'gensim':
            train_gensim_model(
                train_file, output_file, model_size, params['epochs'],
                params['neg'], params['burn_in'], params['batch_size'], params['reg'], seed=0)
        else:
            raise ValueError('Given implementation %s not found' % implementation)
        files[model_size] = output_file
    return (model_name, files)

In [ ]:
model_files = {}

In [ ]:
model_files['c++'] = {}
# Train c++ models with default params
model_name, files = train_model_with_params(default_params, wordnet_file, model_sizes, 'cpp_model', 'c++')
model_files['c++'][model_name] = {}
for dim, filepath in files.items():
    model_files['c++'][model_name][dim] = filepath
# Train c++ models with non-default params
for param, values in non_default_params.items():
    params = default_params.copy()
    for value in values:
        params[param] = value
        model_name, files = train_model_with_params(params, wordnet_file, model_sizes, 'cpp_model', 'c++')
        model_files['c++'][model_name] = {}
        for dim, filepath in files.items():
            model_files['c++'][model_name][dim] = filepath


Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 5
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 10
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 20
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 50
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 100
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 200
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 5
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 10
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 20
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 50
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 100
Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 200
Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 5
Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 10
Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 20
Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 50
Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 100
Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 200
Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 5
Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 10
Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 20
Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 50
Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 100
Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 200

2.3 Training numpy embeddings (non-gensim)


In [ ]:
python_script_path = os.path.join(parent_directory, np_repo_name, 'poincare.py')

In [ ]:
def np_model_name_from_params(params, prefix):
    param_keys = ['neg', 'epochs']
    name = ['%s_%s' % (key, params[key]) for key in sorted(param_keys)]
    return '%s_%s' % (prefix, '_'.join(name))

def train_external_numpy_model(
    script_path, data_file, output_file, dim, epochs, neg, seed=0):
    """Train a poincare embedding using an external numpy implementation
    
    Args:
        script_path (str): Path to the Python training script
        data_file (str): Path to tsv file containing relation pairs
        output_file (str): Path to output file containing model
        dim (int): Number of dimensions of the trained model
        epochs (int): Number of epochs to use
        neg (int): Number of negative samples to use
    
    Notes: 
        If `output_file` already exists, skips training
    """
    if os.path.exists(output_file):
        print('File %s exists, skipping' % output_file)
        return
    args = {
        'input-file': data_file,
        'output-file': output_file,
        'dimensions': dim,
        'epochs': epochs,
        'learning-rate': 0.01,
        'num-negative': neg,
    }
    cmd = ['python', script_path]
    for option, value in args.items():
        cmd.append("--%s" % option)
        cmd.append(str(value))
    
    return check_output(args=cmd)

In [ ]:
model_files['numpy'] = {}
# Train models with default params
model_name, files = train_model_with_params(default_params, wordnet_file, model_sizes, 'np_model', 'numpy')
model_files['numpy'][model_name] = {}
for dim, filepath in files.items():
    model_files['numpy'][model_name][dim] = filepath


Training model np_model_epochs_50_neg_20 of size 5
Training model np_model_epochs_50_neg_20 of size 10
Training model np_model_epochs_50_neg_20 of size 20
Training model np_model_epochs_50_neg_20 of size 50
Training model np_model_epochs_50_neg_20 of size 100
Training model np_model_epochs_50_neg_20 of size 200

2.4 Training gensim embeddings


In [ ]:
def gensim_model_name_from_params(params, prefix):
    param_keys = ['neg', 'epochs', 'burn_in', 'batch_size', 'reg']
    name = ['%s_%s' % (key, params[key]) for key in sorted(param_keys)]
    return '%s_%s' % (prefix, '_'.join(name))

def train_gensim_model(
    data_file, output_file, dim, epochs, neg, burn_in, batch_size, reg, seed=0):
    """Train a poincare embedding using gensim implementation
    
    Args:
        data_file (str): Path to tsv file containing relation pairs
        output_file (str): Path to output file containing model
        dim (int): Number of dimensions of the trained model
        epochs (int): Number of epochs to use
        neg (int): Number of negative samples to use
        burn_in (int): Number of epochs to use for burn-in initialization
        batch_size (int): Size of batch to use for training
        reg (float): Coefficient used for l2-regularization while training
    
    Notes: 
        If `output_file` already exists, skips training
    """
    if os.path.exists(output_file):
        print('File %s exists, skipping' % output_file)
        return
    train_data = PoincareRelations(data_file)
    model = PoincareModel(train_data, size=dim, negative=neg, burn_in=burn_in, regularization_coeff=reg)
    model.train(epochs=epochs, batch_size=batch_size)
    model.save(output_file)

In [ ]:
non_default_params_gensim = [
    {'neg': 10,},
    {'burn_in': 10,},
    {'batch_size': 50,},
    {'neg': 10, 'reg': 1, 'burn_in': 10, 'epochs': 200},
]

In [ ]:
model_files['gensim'] = {}
# Train models with default params
model_name, files = train_model_with_params(default_params, wordnet_file, model_sizes, 'gensim_model', 'gensim')
model_files['gensim'][model_name] = {}
for dim, filepath in files.items():
    model_files['gensim'][model_name][dim] = filepath
# Train models with non-default params
for new_params in non_default_params_gensim:
    params = default_params.copy()
    params.update(new_params)
    model_name, files = train_model_with_params(params, wordnet_file, model_sizes, 'gensim_model', 'gensim')
    model_files['gensim'][model_name] = {}
    for dim, filepath in files.items():
        model_files['gensim'][model_name][dim] = filepath


INFO:gensim.models.poincare:loading relations from train data..
WARNING:smart_open.smart_open_lib:this function is deprecated, use smart_open.open instead
Training model gensim_model_batch_size_10_burn_in_0_epochs_50_neg_20_reg_0.0 of size 5
INFO:gensim.models.poincare:loaded 743241 relations from train data, 82114 nodes
INFO:gensim.models.poincare:training model of size 5 with 1 workers on 743241 relations for 50 epochs and 0 burn-in epochs, using lr=0.10000 burn-in lr=0.01000 negative=20
INFO:gensim.models.poincare:starting training (50 epochs)----------------------------------------
INFO:gensim.models.poincare:training on epoch 1, examples #9990-#10000, loss: 30.71
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4749.76 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #19990-#20000, loss: 30.62
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4909.96 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #29990-#30000, loss: 30.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4855.20 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #39990-#40000, loss: 30.49
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4912.49 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #49990-#50000, loss: 30.44
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4887.18 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #59990-#60000, loss: 30.39
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4700.32 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #69990-#70000, loss: 30.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4834.01 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #79990-#80000, loss: 30.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4806.10 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #89990-#90000, loss: 30.28
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5037.74 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #99990-#100000, loss: 30.24
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5029.46 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #109990-#110000, loss: 30.22
INFO:gensim.models.poincare:time taken for 10000 examples: 1.96 s, 5111.02 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #119990-#120000, loss: 30.20
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4915.07 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #129990-#130000, loss: 30.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.28 s, 4383.69 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #139990-#140000, loss: 30.14
INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4619.44 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #149990-#150000, loss: 30.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.23 s, 4489.14 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #159990-#160000, loss: 30.09
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4831.86 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #169990-#170000, loss: 30.06
INFO:gensim.models.poincare:time taken for 10000 examples: 1.95 s, 5120.16 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #179990-#180000, loss: 30.06
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.55 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #189990-#190000, loss: 30.03
INFO:gensim.models.poincare:time taken for 10000 examples: 2.23 s, 4480.38 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #199990-#200000, loss: 29.99
INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4630.69 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #209990-#210000, loss: 29.98
INFO:gensim.models.poincare:time taken for 10000 examples: 2.43 s, 4106.87 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #219990-#220000, loss: 29.96
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4587.86 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #229990-#230000, loss: 29.93
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4963.33 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #239990-#240000, loss: 29.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4701.92 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #249990-#250000, loss: 29.89
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4744.32 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #259990-#260000, loss: 29.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.30 s, 4344.27 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #269990-#270000, loss: 29.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3860.48 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #279990-#280000, loss: 29.84
INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3889.60 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #289990-#290000, loss: 29.81
INFO:gensim.models.poincare:time taken for 10000 examples: 2.29 s, 4357.69 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #299990-#300000, loss: 29.78
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4823.16 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #309990-#310000, loss: 29.74
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4990.55 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #319990-#320000, loss: 29.75
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.94 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #329990-#330000, loss: 29.71
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4809.07 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #339990-#340000, loss: 29.68
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4787.32 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #349990-#350000, loss: 29.68
INFO:gensim.models.poincare:time taken for 10000 examples: 2.25 s, 4451.75 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #359990-#360000, loss: 29.65
INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4092.97 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #369990-#370000, loss: 29.63
INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3867.21 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #379990-#380000, loss: 29.62
INFO:gensim.models.poincare:time taken for 10000 examples: 2.51 s, 3980.40 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #389990-#390000, loss: 29.59
INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4463.61 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #399990-#400000, loss: 29.59
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5001.52 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #409990-#410000, loss: 29.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4754.73 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #419990-#420000, loss: 29.53
INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4533.89 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #429990-#430000, loss: 29.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4778.84 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #439990-#440000, loss: 29.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4586.84 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #449990-#450000, loss: 29.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4713.23 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #459990-#460000, loss: 29.43
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4918.35 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #469990-#470000, loss: 29.41
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4671.67 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #479990-#480000, loss: 29.41
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4679.77 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #489990-#490000, loss: 29.38
INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4545.91 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #499990-#500000, loss: 29.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.52 s, 3969.84 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #509990-#510000, loss: 29.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.38 s, 4198.87 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #519990-#520000, loss: 29.32
INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4092.34 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #529990-#530000, loss: 29.27
INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3885.30 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #539990-#540000, loss: 29.26
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4838.04 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #549990-#550000, loss: 29.21
INFO:gensim.models.poincare:time taken for 10000 examples: 2.88 s, 3469.95 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #559990-#560000, loss: 29.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.95 s, 3389.43 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #569990-#570000, loss: 29.17
INFO:gensim.models.poincare:time taken for 10000 examples: 2.67 s, 3742.41 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #579990-#580000, loss: 29.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.98 s, 3351.91 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #589990-#590000, loss: 29.10
INFO:gensim.models.poincare:time taken for 10000 examples: 2.95 s, 3387.14 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #599990-#600000, loss: 29.07
INFO:gensim.models.poincare:time taken for 10000 examples: 3.27 s, 3054.74 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #609990-#610000, loss: 29.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.82 s, 3551.59 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #619990-#620000, loss: 29.00
INFO:gensim.models.poincare:time taken for 10000 examples: 2.94 s, 3402.01 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #629990-#630000, loss: 29.00
INFO:gensim.models.poincare:time taken for 10000 examples: 2.53 s, 3948.00 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #639990-#640000, loss: 28.97
INFO:gensim.models.poincare:time taken for 10000 examples: 3.26 s, 3068.72 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #649990-#650000, loss: 28.91
INFO:gensim.models.poincare:time taken for 10000 examples: 2.73 s, 3663.22 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #659990-#660000, loss: 28.94
INFO:gensim.models.poincare:time taken for 10000 examples: 3.47 s, 2879.20 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #669990-#670000, loss: 28.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.73 s, 3662.08 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #679990-#680000, loss: 28.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.91 s, 3432.97 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #689990-#690000, loss: 28.84
INFO:gensim.models.poincare:time taken for 10000 examples: 3.12 s, 3202.25 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #699990-#700000, loss: 28.77
INFO:gensim.models.poincare:time taken for 10000 examples: 3.52 s, 2837.12 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #709990-#710000, loss: 28.72
INFO:gensim.models.poincare:time taken for 10000 examples: 2.91 s, 3437.84 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #719990-#720000, loss: 28.72
INFO:gensim.models.poincare:time taken for 10000 examples: 2.34 s, 4277.28 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #729990-#730000, loss: 28.68
INFO:gensim.models.poincare:time taken for 10000 examples: 3.27 s, 3056.02 examples / s
INFO:gensim.models.poincare:training on epoch 1, examples #739990-#740000, loss: 28.65
INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3862.10 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #9990-#10000, loss: 28.36
INFO:gensim.models.poincare:time taken for 10000 examples: 3.32 s, 3015.39 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #19990-#20000, loss: 28.38
INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4098.29 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #29990-#30000, loss: 28.33
INFO:gensim.models.poincare:time taken for 10000 examples: 3.24 s, 3082.26 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #39990-#40000, loss: 28.24
INFO:gensim.models.poincare:time taken for 10000 examples: 2.37 s, 4215.99 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #49990-#50000, loss: 28.24
INFO:gensim.models.poincare:time taken for 10000 examples: 3.26 s, 3065.69 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #59990-#60000, loss: 28.20
INFO:gensim.models.poincare:time taken for 10000 examples: 2.47 s, 4056.48 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #69990-#70000, loss: 28.14
INFO:gensim.models.poincare:time taken for 10000 examples: 2.92 s, 3424.05 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #79990-#80000, loss: 28.11
INFO:gensim.models.poincare:time taken for 10000 examples: 3.10 s, 3225.82 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #89990-#90000, loss: 28.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.87 s, 3485.15 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #99990-#100000, loss: 28.07
INFO:gensim.models.poincare:time taken for 10000 examples: 3.04 s, 3290.47 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #109990-#110000, loss: 28.01
INFO:gensim.models.poincare:time taken for 10000 examples: 2.30 s, 4350.37 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #119990-#120000, loss: 27.98
INFO:gensim.models.poincare:time taken for 10000 examples: 2.93 s, 3410.48 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #129990-#130000, loss: 27.93
INFO:gensim.models.poincare:time taken for 10000 examples: 2.88 s, 3476.53 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #139990-#140000, loss: 27.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.85 s, 3509.82 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #149990-#150000, loss: 27.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.89 s, 3455.85 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #159990-#160000, loss: 27.81
INFO:gensim.models.poincare:time taken for 10000 examples: 2.47 s, 4049.47 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #169990-#170000, loss: 27.80
INFO:gensim.models.poincare:time taken for 10000 examples: 2.89 s, 3462.15 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #179990-#180000, loss: 27.76
INFO:gensim.models.poincare:time taken for 10000 examples: 3.02 s, 3309.25 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #189990-#190000, loss: 27.72
INFO:gensim.models.poincare:time taken for 10000 examples: 2.40 s, 4170.28 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #199990-#200000, loss: 27.67
INFO:gensim.models.poincare:time taken for 10000 examples: 3.29 s, 3041.13 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #209990-#210000, loss: 27.59
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4856.18 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #219990-#220000, loss: 27.52
INFO:gensim.models.poincare:time taken for 10000 examples: 3.05 s, 3280.16 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #229990-#230000, loss: 27.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.67 s, 3743.50 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #239990-#240000, loss: 27.49
INFO:gensim.models.poincare:time taken for 10000 examples: 2.74 s, 3646.28 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #249990-#250000, loss: 27.43
INFO:gensim.models.poincare:time taken for 10000 examples: 2.97 s, 3371.91 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #259990-#260000, loss: 27.42
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4667.36 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #269990-#270000, loss: 27.34
INFO:gensim.models.poincare:time taken for 10000 examples: 2.97 s, 3368.10 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #279990-#280000, loss: 27.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.79 s, 3580.47 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #289990-#290000, loss: 27.29
INFO:gensim.models.poincare:time taken for 10000 examples: 2.61 s, 3830.69 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #299990-#300000, loss: 27.24
INFO:gensim.models.poincare:time taken for 10000 examples: 3.17 s, 3150.43 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #309990-#310000, loss: 27.21
INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4090.56 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #319990-#320000, loss: 27.17
INFO:gensim.models.poincare:time taken for 10000 examples: 2.84 s, 3520.51 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #329990-#330000, loss: 27.07
INFO:gensim.models.poincare:time taken for 10000 examples: 3.06 s, 3264.34 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #339990-#340000, loss: 27.06
INFO:gensim.models.poincare:time taken for 10000 examples: 2.61 s, 3826.93 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #349990-#350000, loss: 26.97
INFO:gensim.models.poincare:time taken for 10000 examples: 3.00 s, 3328.54 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #359990-#360000, loss: 26.94
INFO:gensim.models.poincare:time taken for 10000 examples: 3.12 s, 3207.03 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #369990-#370000, loss: 26.87
INFO:gensim.models.poincare:time taken for 10000 examples: 2.33 s, 4296.87 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #379990-#380000, loss: 26.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4729.59 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #389990-#390000, loss: 26.78
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4997.96 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #399990-#400000, loss: 26.77
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4942.94 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #409990-#410000, loss: 26.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4938.90 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #419990-#420000, loss: 26.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4861.93 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #429990-#430000, loss: 26.69
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5017.22 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #439990-#440000, loss: 26.55
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5031.36 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #449990-#450000, loss: 26.53
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5088.95 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #459990-#460000, loss: 26.56
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5020.40 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #469990-#470000, loss: 26.41
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5017.54 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #479990-#480000, loss: 26.40
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4909.68 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #489990-#490000, loss: 26.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.52 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #499990-#500000, loss: 26.20
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4794.14 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #509990-#510000, loss: 26.19
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4875.88 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #519990-#520000, loss: 26.21
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5053.06 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #529990-#530000, loss: 26.08
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5075.26 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #539990-#540000, loss: 26.09
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4992.10 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #549990-#550000, loss: 26.09
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4949.35 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #559990-#560000, loss: 25.96
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4884.55 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #569990-#570000, loss: 25.94
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4952.91 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #579990-#580000, loss: 25.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4944.52 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #589990-#590000, loss: 25.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4940.39 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #599990-#600000, loss: 25.77
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5019.82 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #609990-#610000, loss: 25.73
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4921.34 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #619990-#620000, loss: 25.66
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4970.34 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #629990-#630000, loss: 25.61
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4992.80 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #639990-#640000, loss: 25.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4969.25 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #649990-#650000, loss: 25.57
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5024.53 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #659990-#660000, loss: 25.53
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4898.76 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #669990-#670000, loss: 25.38
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5047.43 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #679990-#680000, loss: 25.42
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4961.59 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #689990-#690000, loss: 25.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4949.78 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #699990-#700000, loss: 25.27
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4917.41 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #709990-#710000, loss: 25.24
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5010.41 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #719990-#720000, loss: 25.14
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4948.47 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #729990-#730000, loss: 25.10
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5015.81 examples / s
INFO:gensim.models.poincare:training on epoch 2, examples #739990-#740000, loss: 25.10
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4997.04 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #9990-#10000, loss: 24.59
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4983.47 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #19990-#20000, loss: 24.52
INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5148.98 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #29990-#30000, loss: 24.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4915.44 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #39990-#40000, loss: 24.37
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5037.22 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #49990-#50000, loss: 24.37
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.40 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #59990-#60000, loss: 24.29
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5063.77 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #69990-#70000, loss: 24.23
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4906.60 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #79990-#80000, loss: 24.14
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5077.38 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #89990-#90000, loss: 24.08
INFO:gensim.models.poincare:time taken for 10000 examples: 1.96 s, 5107.00 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #99990-#100000, loss: 24.04
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5035.09 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #109990-#110000, loss: 24.03
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5060.39 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #119990-#120000, loss: 24.12
INFO:gensim.models.poincare:time taken for 10000 examples: 1.95 s, 5136.46 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #129990-#130000, loss: 23.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.59 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #139990-#140000, loss: 23.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4857.96 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #149990-#150000, loss: 23.83
INFO:gensim.models.poincare:time taken for 10000 examples: 2.23 s, 4483.37 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #159990-#160000, loss: 23.79
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4960.81 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #169990-#170000, loss: 23.77
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4685.71 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #179990-#180000, loss: 23.65
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4726.91 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #189990-#190000, loss: 23.64
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.85 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #199990-#200000, loss: 23.56
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4797.30 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #209990-#210000, loss: 23.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4458.22 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #219990-#220000, loss: 23.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4800.01 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #229990-#230000, loss: 23.41
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4777.22 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #239990-#240000, loss: 23.37
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5039.90 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #249990-#250000, loss: 23.32
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5003.08 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #259990-#260000, loss: 23.26
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4905.22 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #269990-#270000, loss: 23.23
INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3856.47 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #279990-#280000, loss: 23.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.83 s, 3532.78 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #289990-#290000, loss: 23.16
INFO:gensim.models.poincare:time taken for 10000 examples: 2.45 s, 4085.00 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #299990-#300000, loss: 23.10
INFO:gensim.models.poincare:time taken for 10000 examples: 2.26 s, 4422.89 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #309990-#310000, loss: 22.97
INFO:gensim.models.poincare:time taken for 10000 examples: 2.32 s, 4316.61 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #319990-#320000, loss: 22.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4681.67 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #329990-#330000, loss: 22.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4678.16 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #339990-#340000, loss: 22.82
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4718.35 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #349990-#350000, loss: 22.81
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4889.15 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #359990-#360000, loss: 22.77
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4915.44 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #369990-#370000, loss: 22.72
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4908.47 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #379990-#380000, loss: 22.70
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4965.93 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #389990-#390000, loss: 22.63
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4671.89 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #399990-#400000, loss: 22.54
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4838.80 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #409990-#410000, loss: 22.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4848.72 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #419990-#420000, loss: 22.50
INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4627.03 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #429990-#430000, loss: 22.39
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4696.29 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #439990-#440000, loss: 22.42
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4847.23 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #449990-#450000, loss: 22.29
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4871.52 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #459990-#460000, loss: 22.21
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4969.33 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #469990-#470000, loss: 22.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4919.60 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #479990-#480000, loss: 22.11
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4782.22 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #489990-#490000, loss: 22.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4882.60 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #499990-#500000, loss: 22.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4542.77 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #509990-#510000, loss: 22.02
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4593.91 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #519990-#520000, loss: 21.89
INFO:gensim.models.poincare:time taken for 10000 examples: 2.26 s, 4420.52 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #529990-#530000, loss: 21.98
INFO:gensim.models.poincare:time taken for 10000 examples: 1.95 s, 5115.37 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #539990-#540000, loss: 21.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4781.64 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #549990-#550000, loss: 21.68
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4948.20 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #559990-#560000, loss: 21.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4868.35 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #569990-#570000, loss: 21.76
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4670.71 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #579990-#580000, loss: 21.65
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4693.07 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #589990-#590000, loss: 21.52
INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4474.16 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #599990-#600000, loss: 21.50
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4952.18 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #609990-#610000, loss: 21.52
INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4468.84 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #619990-#620000, loss: 21.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.22 s, 4507.36 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #629990-#630000, loss: 21.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.51 s, 3978.63 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #639990-#640000, loss: 21.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4829.42 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #649990-#650000, loss: 21.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4726.57 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #659990-#660000, loss: 21.22
INFO:gensim.models.poincare:time taken for 10000 examples: 2.25 s, 4453.09 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #669990-#670000, loss: 21.20
INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4547.93 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #679990-#680000, loss: 21.09
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4698.17 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #689990-#690000, loss: 21.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4962.71 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #699990-#700000, loss: 21.07
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4954.70 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #709990-#710000, loss: 21.03
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4977.30 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #719990-#720000, loss: 20.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4532.46 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #729990-#730000, loss: 20.99
INFO:gensim.models.poincare:time taken for 10000 examples: 2.26 s, 4421.55 examples / s
INFO:gensim.models.poincare:training on epoch 3, examples #739990-#740000, loss: 20.82
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4812.09 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #9990-#10000, loss: 20.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4880.65 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #19990-#20000, loss: 20.39
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.67 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #29990-#30000, loss: 20.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.43 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #39990-#40000, loss: 20.26
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4761.39 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #49990-#50000, loss: 20.22
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4795.74 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #59990-#60000, loss: 20.10
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4747.03 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #69990-#70000, loss: 20.02
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4829.93 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #79990-#80000, loss: 20.11
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5044.28 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #89990-#90000, loss: 19.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4863.30 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #99990-#100000, loss: 20.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4678.92 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #109990-#110000, loss: 20.00
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4888.53 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #119990-#120000, loss: 19.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4667.48 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #129990-#130000, loss: 19.87
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4901.64 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #139990-#140000, loss: 19.87
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5002.31 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #149990-#150000, loss: 19.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4938.18 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #159990-#160000, loss: 19.76
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5070.00 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #169990-#170000, loss: 19.64
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.07 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #179990-#180000, loss: 19.64
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4716.83 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #189990-#190000, loss: 19.54
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.98 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #199990-#200000, loss: 19.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4698.48 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #209990-#210000, loss: 19.57
INFO:gensim.models.poincare:time taken for 10000 examples: 2.19 s, 4563.29 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #219990-#220000, loss: 19.36
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.55 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #229990-#230000, loss: 19.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4871.79 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #239990-#240000, loss: 19.21
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4982.23 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #249990-#250000, loss: 19.27
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4763.58 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #259990-#260000, loss: 19.33
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4789.60 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #269990-#270000, loss: 19.23
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4979.62 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #279990-#280000, loss: 19.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4865.48 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #289990-#290000, loss: 19.27
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4861.66 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #299990-#300000, loss: 19.08
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4968.50 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #309990-#310000, loss: 19.09
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4944.28 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #319990-#320000, loss: 18.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4959.73 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #329990-#330000, loss: 18.94
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4985.03 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #339990-#340000, loss: 18.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5007.82 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #349990-#350000, loss: 18.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4756.73 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #359990-#360000, loss: 18.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4833.77 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #369990-#370000, loss: 18.82
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4968.78 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #379990-#380000, loss: 18.71
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5060.18 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #389990-#390000, loss: 18.67
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5039.61 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #399990-#400000, loss: 18.61
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5059.70 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #409990-#410000, loss: 18.69
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5017.67 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #419990-#420000, loss: 18.57
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5007.73 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #429990-#430000, loss: 18.42
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4954.07 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #439990-#440000, loss: 18.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4969.06 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #449990-#450000, loss: 18.53
INFO:gensim.models.poincare:time taken for 10000 examples: 2.17 s, 4605.69 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #459990-#460000, loss: 18.47
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5060.19 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #469990-#470000, loss: 18.33
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4981.71 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #479990-#480000, loss: 18.37
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4841.89 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #489990-#490000, loss: 18.26
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4850.45 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #499990-#500000, loss: 18.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4941.40 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #509990-#510000, loss: 18.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4887.96 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #519990-#520000, loss: 18.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5001.63 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #529990-#530000, loss: 18.17
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4822.19 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #539990-#540000, loss: 18.16
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4833.48 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #549990-#550000, loss: 18.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5004.34 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #559990-#560000, loss: 18.10
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5011.97 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #569990-#570000, loss: 17.88
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4852.27 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #579990-#580000, loss: 17.93
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4894.56 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #589990-#590000, loss: 17.89
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4941.00 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #599990-#600000, loss: 17.77
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5038.40 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #609990-#610000, loss: 17.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4938.77 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #619990-#620000, loss: 17.72
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4813.41 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #629990-#630000, loss: 17.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.75 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #639990-#640000, loss: 17.75
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4949.58 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #649990-#650000, loss: 17.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4986.85 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #659990-#660000, loss: 17.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4977.14 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #669990-#670000, loss: 17.66
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5064.44 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #679990-#680000, loss: 17.49
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4986.14 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #689990-#690000, loss: 17.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4998.18 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #699990-#700000, loss: 17.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4939.02 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #709990-#710000, loss: 17.48
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5034.09 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #719990-#720000, loss: 17.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4990.34 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #729990-#730000, loss: 17.43
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4978.69 examples / s
INFO:gensim.models.poincare:training on epoch 4, examples #739990-#740000, loss: 17.44
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4946.35 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #9990-#10000, loss: 16.98
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4967.11 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #19990-#20000, loss: 17.01
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5006.20 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #29990-#30000, loss: 17.03
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4878.62 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #39990-#40000, loss: 16.97
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4962.48 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #49990-#50000, loss: 16.87
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4861.49 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #59990-#60000, loss: 16.80
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.24 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #69990-#70000, loss: 16.74
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5053.07 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #79990-#80000, loss: 16.77
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5034.57 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #89990-#90000, loss: 16.77
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4990.77 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #99990-#100000, loss: 16.80
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4928.06 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #109990-#110000, loss: 16.69
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.58 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #119990-#120000, loss: 16.48
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4793.82 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #129990-#130000, loss: 16.78
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5056.86 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #139990-#140000, loss: 16.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4999.35 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #149990-#150000, loss: 16.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4693.17 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #159990-#160000, loss: 16.58
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5005.50 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #169990-#170000, loss: 16.43
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5064.80 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #179990-#180000, loss: 16.39
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.10 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #189990-#190000, loss: 16.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4815.94 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #199990-#200000, loss: 16.29
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4911.52 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #209990-#210000, loss: 16.41
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4846.22 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #219990-#220000, loss: 16.23
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4998.19 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #229990-#230000, loss: 16.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4779.60 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #239990-#240000, loss: 16.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4753.24 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #249990-#250000, loss: 16.10
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4993.15 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #259990-#260000, loss: 16.15
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4828.99 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #269990-#270000, loss: 16.17
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5063.10 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #279990-#280000, loss: 16.13
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4863.24 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #289990-#290000, loss: 16.05
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4868.74 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #299990-#300000, loss: 16.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4842.28 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #309990-#310000, loss: 15.97
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5016.07 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #319990-#320000, loss: 16.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4759.84 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #329990-#330000, loss: 15.98
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4971.92 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #339990-#340000, loss: 16.00
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4928.27 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #349990-#350000, loss: 15.84
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5048.23 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #359990-#360000, loss: 15.91
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4976.53 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #369990-#370000, loss: 15.83
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4948.23 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #379990-#380000, loss: 15.80
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4957.56 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #389990-#390000, loss: 15.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4904.25 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #399990-#400000, loss: 15.75
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.65 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #409990-#410000, loss: 15.78
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4963.78 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #419990-#420000, loss: 15.70
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4888.36 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #429990-#430000, loss: 15.68
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4911.60 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #439990-#440000, loss: 15.64
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4952.78 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #449990-#450000, loss: 15.52
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4718.29 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #459990-#460000, loss: 15.54
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4987.42 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #469990-#470000, loss: 15.63
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5033.39 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #479990-#480000, loss: 15.70
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4838.73 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #489990-#490000, loss: 15.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5000.57 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #499990-#500000, loss: 15.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4689.02 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #509990-#510000, loss: 15.32
INFO:gensim.models.poincare:time taken for 10000 examples: 2.15 s, 4652.77 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #519990-#520000, loss: 15.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4581.93 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #529990-#530000, loss: 15.21
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4957.70 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #539990-#540000, loss: 15.34
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4730.20 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #549990-#550000, loss: 15.19
INFO:gensim.models.poincare:time taken for 10000 examples: 2.36 s, 4237.03 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #559990-#560000, loss: 15.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4771.29 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #569990-#570000, loss: 15.16
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4707.25 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #579990-#580000, loss: 15.26
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4868.55 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #589990-#590000, loss: 15.25
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4853.89 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #599990-#600000, loss: 15.19
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4595.38 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #609990-#610000, loss: 15.08
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4916.04 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #619990-#620000, loss: 15.25
INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4620.15 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #629990-#630000, loss: 14.94
INFO:gensim.models.poincare:time taken for 10000 examples: 2.29 s, 4372.28 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #639990-#640000, loss: 15.02
INFO:gensim.models.poincare:time taken for 10000 examples: 2.26 s, 4420.59 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #649990-#650000, loss: 15.05
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4584.97 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #659990-#660000, loss: 15.13
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4846.84 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #669990-#670000, loss: 14.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4919.27 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #679990-#680000, loss: 14.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4985.22 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #689990-#690000, loss: 15.05
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5057.07 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #699990-#700000, loss: 14.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5000.94 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #709990-#710000, loss: 14.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4840.27 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #719990-#720000, loss: 14.87
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4770.30 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #729990-#730000, loss: 14.76
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5007.93 examples / s
INFO:gensim.models.poincare:training on epoch 5, examples #739990-#740000, loss: 14.70
INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5149.87 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #9990-#10000, loss: 14.62
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5074.41 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #19990-#20000, loss: 14.57
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5068.34 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #29990-#30000, loss: 14.56
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4711.80 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #39990-#40000, loss: 14.53
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4901.93 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #49990-#50000, loss: 14.38
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4972.95 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #59990-#60000, loss: 14.40
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4912.06 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #69990-#70000, loss: 14.48
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4845.06 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #79990-#80000, loss: 14.47
INFO:gensim.models.poincare:time taken for 10000 examples: 2.19 s, 4567.18 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #89990-#90000, loss: 14.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4905.89 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #99990-#100000, loss: 14.27
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4717.66 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #109990-#110000, loss: 14.32
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4583.40 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #119990-#120000, loss: 14.38
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4773.44 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #129990-#130000, loss: 14.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4737.11 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #139990-#140000, loss: 14.29
INFO:gensim.models.poincare:time taken for 10000 examples: 2.25 s, 4445.51 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #149990-#150000, loss: 14.21
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5039.20 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #159990-#160000, loss: 14.25
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4890.26 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #169990-#170000, loss: 14.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4938.33 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #179990-#180000, loss: 14.23
INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4632.99 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #189990-#190000, loss: 14.20
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4979.41 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #199990-#200000, loss: 14.22
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4762.01 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #209990-#210000, loss: 14.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.19 s, 4567.59 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #219990-#220000, loss: 14.07
INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4461.97 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #229990-#230000, loss: 14.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.22 s, 4504.80 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #239990-#240000, loss: 14.14
INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4535.80 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #249990-#250000, loss: 14.15
INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3890.63 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #259990-#260000, loss: 13.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.19 s, 4569.33 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #269990-#270000, loss: 14.05
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4844.90 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #279990-#280000, loss: 14.12
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4676.96 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #289990-#290000, loss: 14.04
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4810.09 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #299990-#300000, loss: 13.99
INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4519.08 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #309990-#310000, loss: 13.95
INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4638.66 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #319990-#320000, loss: 13.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4585.32 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #329990-#330000, loss: 13.90
INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4468.27 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #339990-#340000, loss: 13.89
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4983.74 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #349990-#350000, loss: 13.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.15 s, 4658.19 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #359990-#360000, loss: 13.79
INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4673.10 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #369990-#370000, loss: 13.77
INFO:gensim.models.poincare:time taken for 10000 examples: 2.17 s, 4613.05 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #379990-#380000, loss: 13.63
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4992.76 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #389990-#390000, loss: 13.92
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5000.60 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #399990-#400000, loss: 13.75
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4716.74 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #409990-#410000, loss: 13.76
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4762.14 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #419990-#420000, loss: 13.63
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4856.55 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #429990-#430000, loss: 13.71
INFO:gensim.models.poincare:time taken for 10000 examples: 2.43 s, 4115.02 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #439990-#440000, loss: 13.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.30 s, 4340.13 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #449990-#450000, loss: 13.70
INFO:gensim.models.poincare:time taken for 10000 examples: 2.35 s, 4255.26 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #459990-#460000, loss: 13.60
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4991.39 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #469990-#470000, loss: 13.51
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4979.94 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #479990-#480000, loss: 13.64
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4703.96 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #489990-#490000, loss: 13.60
INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4524.26 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #499990-#500000, loss: 13.61
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4815.57 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #509990-#510000, loss: 13.70
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4812.18 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #519990-#520000, loss: 13.45
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5008.45 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #529990-#530000, loss: 13.59
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4808.91 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #539990-#540000, loss: 13.42
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4886.71 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #549990-#550000, loss: 13.52
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4789.03 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #559990-#560000, loss: 13.37
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4941.37 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #569990-#570000, loss: 13.49
INFO:gensim.models.poincare:time taken for 10000 examples: 2.17 s, 4612.78 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #579990-#580000, loss: 13.37
INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.96 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #589990-#590000, loss: 13.26
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5015.37 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #599990-#600000, loss: 13.29
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4871.27 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #609990-#610000, loss: 13.31
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4965.43 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #619990-#620000, loss: 13.38
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4897.68 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #629990-#630000, loss: 13.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.15 s, 4652.29 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #639990-#640000, loss: 13.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.29 s, 4368.10 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #649990-#650000, loss: 13.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4713.79 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #659990-#660000, loss: 13.22
INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4761.11 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #669990-#670000, loss: 13.23
INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4979.03 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #679990-#680000, loss: 13.24
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4903.94 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #689990-#690000, loss: 13.34
INFO:gensim.models.poincare:time taken for 10000 examples: 2.33 s, 4297.92 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #699990-#700000, loss: 13.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4537.09 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #709990-#710000, loss: 13.18
INFO:gensim.models.poincare:time taken for 10000 examples: 2.77 s, 3613.14 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #719990-#720000, loss: 13.16
INFO:gensim.models.poincare:time taken for 10000 examples: 2.74 s, 3655.50 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #729990-#730000, loss: 13.11
INFO:gensim.models.poincare:time taken for 10000 examples: 2.56 s, 3901.57 examples / s
INFO:gensim.models.poincare:training on epoch 6, examples #739990-#740000, loss: 13.06
INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4740.70 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #9990-#10000, loss: 13.14
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4691.85 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #19990-#20000, loss: 12.79
INFO:gensim.models.poincare:time taken for 10000 examples: 2.23 s, 4486.93 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #29990-#30000, loss: 13.03
INFO:gensim.models.poincare:time taken for 10000 examples: 2.35 s, 4251.78 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #39990-#40000, loss: 12.81
INFO:gensim.models.poincare:time taken for 10000 examples: 2.33 s, 4284.35 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #49990-#50000, loss: 12.94
INFO:gensim.models.poincare:time taken for 10000 examples: 2.37 s, 4223.69 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #59990-#60000, loss: 12.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3896.02 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #69990-#70000, loss: 12.85
INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3884.71 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #79990-#80000, loss: 12.79
INFO:gensim.models.poincare:time taken for 10000 examples: 2.72 s, 3671.83 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #89990-#90000, loss: 12.86
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4873.00 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #99990-#100000, loss: 12.80
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4899.85 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #109990-#110000, loss: 12.84
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4777.10 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #119990-#120000, loss: 12.65
INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5014.45 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #129990-#130000, loss: 12.83
INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4923.53 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #139990-#140000, loss: 12.74
INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4870.12 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #149990-#150000, loss: 12.75
INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4834.36 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #159990-#160000, loss: 12.73
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4775.92 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #169990-#170000, loss: 12.68
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4586.59 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #179990-#180000, loss: 12.81
INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4597.17 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #189990-#190000, loss: 12.60
INFO:gensim.models.poincare:time taken for 10000 examples: 2.38 s, 4209.51 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #199990-#200000, loss: 12.73
INFO:gensim.models.poincare:time taken for 10000 examples: 2.49 s, 4012.02 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #209990-#210000, loss: 12.65
INFO:gensim.models.poincare:time taken for 10000 examples: 2.43 s, 4123.05 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #219990-#220000, loss: 12.82
INFO:gensim.models.poincare:time taken for 10000 examples: 2.38 s, 4200.99 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #229990-#230000, loss: 12.62
INFO:gensim.models.poincare:time taken for 10000 examples: 2.34 s, 4272.35 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #239990-#240000, loss: 12.67
INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4544.30 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #249990-#250000, loss: 12.55
INFO:gensim.models.poincare:time taken for 10000 examples: 2.79 s, 3585.54 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #259990-#260000, loss: 12.61
INFO:gensim.models.poincare:time taken for 10000 examples: 3.22 s, 3108.58 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #269990-#270000, loss: 12.61
INFO:gensim.models.poincare:time taken for 10000 examples: 2.80 s, 3573.95 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #279990-#280000, loss: 12.46
INFO:gensim.models.poincare:time taken for 10000 examples: 3.37 s, 2965.71 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #289990-#290000, loss: 12.61
INFO:gensim.models.poincare:time taken for 10000 examples: 3.00 s, 3337.80 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #299990-#300000, loss: 12.54
INFO:gensim.models.poincare:time taken for 10000 examples: 3.23 s, 3092.74 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #309990-#310000, loss: 12.47
INFO:gensim.models.poincare:time taken for 10000 examples: 3.15 s, 3174.50 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #319990-#320000, loss: 12.42
INFO:gensim.models.poincare:time taken for 10000 examples: 3.00 s, 3337.83 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #329990-#330000, loss: 12.35
INFO:gensim.models.poincare:time taken for 10000 examples: 2.90 s, 3451.59 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #339990-#340000, loss: 12.54
INFO:gensim.models.poincare:time taken for 10000 examples: 2.95 s, 3388.02 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #349990-#350000, loss: 12.46
INFO:gensim.models.poincare:time taken for 10000 examples: 3.03 s, 3298.19 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #359990-#360000, loss: 12.35
INFO:gensim.models.poincare:time taken for 10000 examples: 3.15 s, 3178.27 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #369990-#370000, loss: 12.46
INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3855.54 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #379990-#380000, loss: 12.55
INFO:gensim.models.poincare:time taken for 10000 examples: 3.41 s, 2929.58 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #389990-#390000, loss: 12.32
INFO:gensim.models.poincare:time taken for 10000 examples: 2.87 s, 3479.64 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #399990-#400000, loss: 12.38
INFO:gensim.models.poincare:time taken for 10000 examples: 3.25 s, 3076.83 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #409990-#410000, loss: 12.44
INFO:gensim.models.poincare:time taken for 10000 examples: 3.30 s, 3032.67 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #419990-#420000, loss: 12.27
INFO:gensim.models.poincare:time taken for 10000 examples: 2.46 s, 4070.55 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #429990-#430000, loss: 12.32
INFO:gensim.models.poincare:time taken for 10000 examples: 3.53 s, 2833.03 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #439990-#440000, loss: 12.39
INFO:gensim.models.poincare:time taken for 10000 examples: 2.37 s, 4218.68 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #449990-#450000, loss: 12.31
INFO:gensim.models.poincare:time taken for 10000 examples: 3.58 s, 2794.56 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #459990-#460000, loss: 12.49
INFO:gensim.models.poincare:time taken for 10000 examples: 2.74 s, 3653.72 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #469990-#470000, loss: 12.42
INFO:gensim.models.poincare:time taken for 10000 examples: 4.27 s, 2339.92 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #479990-#480000, loss: 12.39
INFO:gensim.models.poincare:time taken for 10000 examples: 3.70 s, 2705.51 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #489990-#490000, loss: 12.39
INFO:gensim.models.poincare:time taken for 10000 examples: 4.18 s, 2390.35 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #499990-#500000, loss: 12.29
INFO:gensim.models.poincare:time taken for 10000 examples: 4.27 s, 2340.87 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #509990-#510000, loss: 12.27
INFO:gensim.models.poincare:time taken for 10000 examples: 3.33 s, 2999.75 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #519990-#520000, loss: 12.16
INFO:gensim.models.poincare:time taken for 10000 examples: 4.92 s, 2033.91 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #529990-#530000, loss: 12.30
INFO:gensim.models.poincare:time taken for 10000 examples: 2.98 s, 3354.39 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #539990-#540000, loss: 12.26
INFO:gensim.models.poincare:time taken for 10000 examples: 3.52 s, 2839.14 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #549990-#550000, loss: 12.20
INFO:gensim.models.poincare:time taken for 10000 examples: 2.75 s, 3635.44 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #559990-#560000, loss: 12.09
INFO:gensim.models.poincare:time taken for 10000 examples: 3.15 s, 3171.62 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #569990-#570000, loss: 12.35
INFO:gensim.models.poincare:time taken for 10000 examples: 3.63 s, 2754.73 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #579990-#580000, loss: 12.28
INFO:gensim.models.poincare:time taken for 10000 examples: 2.97 s, 3369.52 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #589990-#590000, loss: 12.15
INFO:gensim.models.poincare:time taken for 10000 examples: 3.81 s, 2627.79 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #599990-#600000, loss: 12.11
INFO:gensim.models.poincare:time taken for 10000 examples: 3.46 s, 2890.72 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #609990-#610000, loss: 12.08
INFO:gensim.models.poincare:time taken for 10000 examples: 3.26 s, 3068.25 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #619990-#620000, loss: 11.95
INFO:gensim.models.poincare:time taken for 10000 examples: 3.61 s, 2772.22 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #629990-#630000, loss: 12.19
INFO:gensim.models.poincare:time taken for 10000 examples: 2.43 s, 4116.51 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #639990-#640000, loss: 12.09
INFO:gensim.models.poincare:time taken for 10000 examples: 3.68 s, 2716.18 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #649990-#650000, loss: 12.01
INFO:gensim.models.poincare:time taken for 10000 examples: 3.22 s, 3106.93 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #659990-#660000, loss: 12.19
INFO:gensim.models.poincare:time taken for 10000 examples: 3.30 s, 3029.78 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #669990-#670000, loss: 11.92
INFO:gensim.models.poincare:time taken for 10000 examples: 3.71 s, 2696.68 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #679990-#680000, loss: 11.89
INFO:gensim.models.poincare:time taken for 10000 examples: 2.76 s, 3623.31 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #689990-#690000, loss: 12.04
INFO:gensim.models.poincare:time taken for 10000 examples: 3.66 s, 2734.81 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #699990-#700000, loss: 12.01
INFO:gensim.models.poincare:time taken for 10000 examples: 2.80 s, 3570.62 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #709990-#710000, loss: 11.99
INFO:gensim.models.poincare:time taken for 10000 examples: 2.94 s, 3395.87 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #719990-#720000, loss: 12.01
INFO:gensim.models.poincare:time taken for 10000 examples: 3.05 s, 3275.42 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #729990-#730000, loss: 11.90
INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4689.34 examples / s
INFO:gensim.models.poincare:training on epoch 7, examples #739990-#740000, loss: 11.97
INFO:gensim.models.poincare:time taken for 10000 examples: 2.96 s, 3380.01 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #9990-#10000, loss: 11.83
INFO:gensim.models.poincare:time taken for 10000 examples: 2.64 s, 3788.66 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #19990-#20000, loss: 11.96
INFO:gensim.models.poincare:time taken for 10000 examples: 2.72 s, 3678.58 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #29990-#30000, loss: 11.99
INFO:gensim.models.poincare:time taken for 10000 examples: 2.90 s, 3446.74 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #39990-#40000, loss: 11.81
INFO:gensim.models.poincare:time taken for 10000 examples: 2.39 s, 4183.87 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #49990-#50000, loss: 11.84
INFO:gensim.models.poincare:time taken for 10000 examples: 2.73 s, 3660.45 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #59990-#60000, loss: 11.78
INFO:gensim.models.poincare:time taken for 10000 examples: 2.87 s, 3487.22 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #69990-#70000, loss: 11.79
INFO:gensim.models.poincare:time taken for 10000 examples: 2.58 s, 3874.98 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #79990-#80000, loss: 11.77
INFO:gensim.models.poincare:time taken for 10000 examples: 1.96 s, 5106.28 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #89990-#90000, loss: 11.50
INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4520.77 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #99990-#100000, loss: 11.84
INFO:gensim.models.poincare:time taken for 10000 examples: 2.69 s, 3712.38 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #109990-#110000, loss: 11.68
INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4776.22 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #119990-#120000, loss: 11.71
INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5221.46 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #129990-#130000, loss: 11.77
INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5217.92 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #139990-#140000, loss: 11.77
INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5322.30 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #149990-#150000, loss: 11.68
INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5061.23 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #159990-#160000, loss: 11.73
INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5312.49 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #169990-#170000, loss: 11.58
INFO:gensim.models.poincare:time taken for 10000 examples: 1.85 s, 5396.36 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #179990-#180000, loss: 11.64
INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5283.63 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #189990-#190000, loss: 11.76
INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5373.89 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #199990-#200000, loss: 11.62
INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5286.74 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #209990-#210000, loss: 11.66
INFO:gensim.models.poincare:time taken for 10000 examples: 1.93 s, 5171.18 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #219990-#220000, loss: 11.57
INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5317.64 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #229990-#230000, loss: 11.56
INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5383.91 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #239990-#240000, loss: 11.70
INFO:gensim.models.poincare:time taken for 10000 examples: 1.91 s, 5238.90 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #249990-#250000, loss: 11.43
INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5300.40 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #259990-#260000, loss: 11.70
INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5288.47 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #269990-#270000, loss: 11.60
INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5083.33 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #279990-#280000, loss: 11.63
INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5195.28 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #289990-#290000, loss: 11.42
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4996.08 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #299990-#300000, loss: 11.59
INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5009.45 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #309990-#310000, loss: 11.63
INFO:gensim.models.poincare:time taken for 10000 examples: 1.90 s, 5264.10 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #319990-#320000, loss: 11.56
INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5316.93 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #329990-#330000, loss: 11.47
INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5167.73 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #339990-#340000, loss: 11.51
INFO:gensim.models.poincare:time taken for 10000 examples: 1.87 s, 5337.89 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #349990-#350000, loss: 11.54
INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5389.10 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #359990-#360000, loss: 11.49
INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5362.69 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #369990-#370000, loss: 11.43
INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4896.47 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #379990-#380000, loss: 11.45
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4814.46 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #389990-#390000, loss: 11.48
INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4818.25 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #399990-#400000, loss: 11.38
INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4960.60 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #409990-#410000, loss: 11.45
INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5365.93 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #419990-#420000, loss: 11.59
INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5206.10 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #429990-#430000, loss: 11.39
INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5322.43 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #439990-#440000, loss: 11.45
INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5301.80 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #449990-#450000, loss: 11.45
INFO:gensim.models.poincare:time taken for 10000 examples: 1.91 s, 5242.85 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #459990-#460000, loss: 11.30
INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5149.58 examples / s
INFO:gensim.models.poincare:training on epoch 8, examples #469990-#470000, loss: 11.45
INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5211.54 examples / s

3. Loading the embeddings


In [ ]:
def transform_cpp_embedding_to_kv(input_file, output_file, encoding='utf8'):
    """Given a C++ embedding tsv filepath, converts it to a KeyedVector-supported file"""
    with smart_open(input_file, 'rb') as f:
        lines = [line.decode(encoding) for line in f]
    if not len(lines):
         raise ValueError("file is empty")
    first_line = lines[0]
    parts = first_line.rstrip().split("\t")
    model_size = len(parts) - 1
    vocab_size = len(lines)
    with smart_open(output_file, 'w') as f:
        f.write('%d %d\n' % (vocab_size, model_size))
        for line in lines:
            f.write(line.replace('\t', ' '))

def transform_numpy_embedding_to_kv(input_file, output_file, encoding='utf8'):
    """Given a numpy poincare embedding pkl filepath, converts it to a KeyedVector-supported file"""
    np_embeddings = pickle.load(open(input_file, 'rb'))
    random_embedding = np_embeddings[list(np_embeddings.keys())[0]]
    
    model_size = random_embedding.shape[0]
    vocab_size = len(np_embeddings)
    with smart_open(output_file, 'w') as f:
        f.write('%d %d\n' % (vocab_size, model_size))
        for key, vector in np_embeddings.items():
            vector_string = ' '.join('%.6f' % value for value in vector)
            f.write('%s %s\n' % (key, vector_string))

def load_poincare_cpp(input_filename):
    """Load embedding trained via C++ Poincare model.

    Parameters
    ----------
    filepath : str
        Path to tsv file containing embedding.

    Returns
    -------
    PoincareKeyedVectors instance.

    """
    keyed_vectors_filename = input_filename + '.kv'
    transform_cpp_embedding_to_kv(input_filename, keyed_vectors_filename)
    embedding = PoincareKeyedVectors.load_word2vec_format(keyed_vectors_filename)
    os.unlink(keyed_vectors_filename)
    return embedding

def load_poincare_numpy(input_filename):
    """Load embedding trained via Python numpy Poincare model.

    Parameters
    ----------
    filepath : str
        Path to pkl file containing embedding.

    Returns:
        PoincareKeyedVectors instance.

    """
    keyed_vectors_filename = input_filename + '.kv'
    transform_numpy_embedding_to_kv(input_filename, keyed_vectors_filename)
    embedding = PoincareKeyedVectors.load_word2vec_format(keyed_vectors_filename)
    os.unlink(keyed_vectors_filename)
    return embedding

def load_poincare_gensim(input_filename):
    """Load embedding trained via Gensim PoincareModel.

    Parameters
    ----------
    filepath : str
        Path to model file.

    Returns:
        PoincareKeyedVectors instance.

    """
    model = PoincareModel.load(input_filename)
    return model.kv

def load_model(implementation, model_file):
    """Convenience function over functions to load models from different implementations.
    
    Parameters
    ----------
    implementation : str
        Implementation used to create model file ('c++'/'numpy'/'gensim').
    model_file : str
        Path to model file.
    
    Returns
    -------
    PoincareKeyedVectors instance
    
    Notes
    -----
    Raises ValueError in case of invalid value for `implementation`

    """
    if implementation == 'c++':
        return load_poincare_cpp(model_file)
    elif implementation == 'numpy':
        return load_poincare_numpy(model_file)
    elif implementation == 'gensim':
        return load_poincare_gensim(model_file)
    else:
        raise ValueError('Invalid implementation %s' % implementation)

4. Evaluation


In [ ]:
def display_results(task_name, results):
    """Display evaluation results of multiple embeddings on a single task in a tabular format
    
    Args:
        task_name (str): name the task being evaluated
        results (dict): mapping between embeddings and corresponding results
    
    """
    result_table = PrettyTable()
    result_table.field_names = ["Model Description", "Metric"] + [str(dim) for dim in sorted(model_sizes)]
    for model_name, model_results in results.items():
        metrics = [metric for metric in model_results.keys()]
        dims = sorted([dim for dim in model_results[metrics[0]].keys()])
        description = model_description_from_name(model_name)
        row = [description, '\n'.join(metrics) + '\n']
        for dim in dims:
            scores = ['%.2f' % model_results[metric][dim] for metric in metrics]
            row.append('\n'.join(scores))
        result_table.add_row(row)
    result_table.align = 'r'
    result_html = result_table.get_html_string()
    search = "<table>"
    insert_at = result_html.index(search) + len(search)
    new_row = """
        <tr>
            <th colspan="1" style="text-align:left">%s</th>
            <th colspan="1"></th>
            <th colspan="%d" style="text-align:center"> Dimensions</th>
        </tr>""" % (task_name, len(model_sizes))
    result_html = result_html[:insert_at] + new_row + result_html[insert_at:]
    display(HTML(result_html))
    
def model_description_from_name(model_name):
    if model_name.startswith('gensim'):
        implementation = 'Gensim'
    elif model_name.startswith('cpp'):
        implementation = 'C++'
    elif model_name.startswith('np'):
        implementation = 'Numpy'
    else:
        raise ValueError('Unsupported implementation for model: %s' % model_name)
    description = []
    for param_key in sorted(default_params.keys()):
        pattern = '%s_([^_]*)_?' % param_key
        match = re.search(pattern, model_name)
        if match:
            description.append("%s=%s" % (param_key, match.groups()[0]))
    return "%s: %s" % (implementation, ", ".join(description))

4.1 WordNet reconstruction

For this task, embeddings are learnt using the entire transitive closure of the WordNet noun hypernym hierarchy. Subsequently, for every hypernym pair (u, v), the rank of v amongst all nodes that do not have a positive edge with v is computed. The final metric mean_rank is the average of all these ranks. The MAP metric is the mean of the Average Precision of the rankings for all positive nodes for a given node u.

Note that this task tests representation capacity of the learnt embeddings, and not the generalization ability.


In [ ]:
reconstruction_results = OrderedDict()
metrics = ['mean_rank', 'MAP']

In [ ]:
for implementation, models in sorted(model_files.items()):
    for model_name, files in models.items():
        if model_name in reconstruction_results:
            continue
        reconstruction_results[model_name] = OrderedDict()
        for metric in metrics:
            reconstruction_results[model_name][metric] = {}
        for model_size, model_file in files.items():
            print('Evaluating model %s of size %d' % (model_name, model_size))
            embedding = load_model(implementation, model_file)
            eval_instance = ReconstructionEvaluation(wordnet_file, embedding)
            eval_result = eval_instance.evaluate(max_n=1000)
            for metric in metrics:
                reconstruction_results[model_name][metric][model_size] = eval_result[metric]

In [ ]:
display_results('WordNet Reconstruction', reconstruction_results)

Results from the paper -

The figures above illustrate a few things -

  1. The gensim implementation does significantly better for all model sizes and hyperparameters than both the other implementations.
  2. The results from the original paper have not been achieved by our implementation. Especially for models with lower dimensions, the paper mentions significantly better mean rank and MAP for the reconstruction task.
  3. Using burn-in and regularization leads to much better results with low model sizes, however the results do not improve significantly with increasing model size. This might have to do with tuning the regularization coefficient, which the paper does not mention.

This task is similar to the reconstruction task described above, except that the list of relations is split into a training and testing set, and the mean rank reported is for the edges in the test set.

Therefore, this tests the ability of the model to predict unseen edges between nodes, i.e. generalization ability, as opposed to the representation capacity tested in the Reconstruction task

4.2.1 Preparing data


In [ ]:
def train_test_split(data_file, test_ratio=0.1):
    """Creates train and test files from given data file, returns train/test file names
    
    Args:
        data_file (str): path to data file for which train/test split is to be created
        test_ratio (float): fraction of lines to be used for test data
    
    Returns
        (train_file, test_file): tuple of strings with train file and test file paths
    """
    train_filename = data_file + '.train'
    test_filename = data_file + '.test'
    if os.path.exists(train_filename) and os.path.exists(test_filename):
        print('Train and test files already exist, skipping')
        return (train_filename, test_filename)
    root_nodes, leaf_nodes = get_root_and_leaf_nodes(data_file)
    test_line_candidates = []
    line_count = 0
    all_nodes = set()
    with smart_open(data_file, 'rb') as f:
        for i, line in enumerate(f):
            node_1, node_2 = line.split()
            all_nodes.update([node_1, node_2])
            if (
                    node_1 not in leaf_nodes
                    and node_2 not in leaf_nodes
                    and node_1 not in root_nodes
                    and node_2 not in root_nodes
                    and node_1 != node_2
                ):
                test_line_candidates.append(i)
            line_count += 1

    num_test_lines = int(test_ratio * line_count)
    if num_test_lines > len(test_line_candidates):
        raise ValueError('Not enough candidate relations for test set')
    print('Choosing %d test lines from %d candidates' % (num_test_lines, len(test_line_candidates)))
    test_line_indices = set(random.sample(test_line_candidates, num_test_lines))
    train_line_indices = set(l for l in range(line_count) if l not in test_line_indices)
    
    train_set_nodes = set()
    with smart_open(data_file, 'rb') as f:
        train_file = smart_open(train_filename, 'wb')
        test_file = smart_open(test_filename, 'wb')
        for i, line in enumerate(f):
            if i in train_line_indices:
                train_set_nodes.update(line.split())
                train_file.write(line)
            elif i in test_line_indices:
                test_file.write(line)
            else:
                raise AssertionError('Line %d not present in either train or test line indices' % i)
        train_file.close()
        test_file.close()
    assert len(train_set_nodes) == len(all_nodes), 'Not all nodes from dataset present in train set relations'
    return (train_filename, test_filename)

In [ ]:
def get_root_and_leaf_nodes(data_file):
    """Return keys of root and leaf nodes from a file with transitive closure relations
    
    Args:
        data_file(str): file path containing transitive closure relations
    
    Returns:
        (root_nodes, leaf_nodes) - tuple containing keys of root and leaf nodes
    """
    root_candidates = set()
    leaf_candidates = set()
    with smart_open(data_file, 'rb') as f:
        for line in f:
            nodes = line.split()
            root_candidates.update(nodes)
            leaf_candidates.update(nodes)
    
    with smart_open(data_file, 'rb') as f:
        for line in f:
            node_1, node_2 = line.split()
            if node_1 == node_2:
                continue
            leaf_candidates.discard(node_1)
            root_candidates.discard(node_2)
    
    return (leaf_candidates, root_candidates)

In [ ]:
wordnet_train_file, wordnet_test_file = train_test_split(wordnet_file)

4.2.2 Training models


In [ ]:
# Training models for link prediction
lp_model_files = {}

In [ ]:
lp_model_files['c++'] = {}
# Train c++ models with default params
model_name, files = train_model_with_params(default_params, wordnet_train_file, model_sizes, 'cpp_lp_model', 'c++')
lp_model_files['c++'][model_name] = {}
for dim, filepath in files.items():
    lp_model_files['c++'][model_name][dim] = filepath
# Train c++ models with non-default params
for param, values in non_default_params.items():
    params = default_params.copy()
    for value in values:
        params[param] = value
        model_name, files = train_model_with_params(params, wordnet_train_file, model_sizes, 'cpp_lp_model', 'c++')
        lp_model_files['c++'][model_name] = {}
        for dim, filepath in files.items():
            lp_model_files['c++'][model_name][dim] = filepath

In [ ]:
lp_model_files['numpy'] = {}
# Train numpy models with default params
model_name, files = train_model_with_params(default_params, wordnet_train_file, model_sizes, 'np_lp_model', 'numpy')
lp_model_files['numpy'][model_name] = {}
for dim, filepath in files.items():
    lp_model_files['numpy'][model_name][dim] = filepath

In [ ]:
lp_model_files['gensim'] = {}
# Train models with default params
model_name, files = train_model_with_params(default_params, wordnet_train_file, model_sizes, 'gensim_lp_model', 'gensim')
lp_model_files['gensim'][model_name] = {}
for dim, filepath in files.items():
    lp_model_files['gensim'][model_name][dim] = filepath
# Train models with non-default params
for new_params in non_default_params_gensim:
    params = default_params.copy()
    params.update(new_params)
    model_name, files = train_model_with_params(params, wordnet_file, model_sizes, 'gensim_lp_model', 'gensim')
    lp_model_files['gensim'][model_name] = {}
    for dim, filepath in files.items():
        lp_model_files['gensim'][model_name][dim] = filepath

4.2.3 Evaluating models


In [ ]:
lp_results = OrderedDict()
metrics = ['mean_rank', 'MAP']

In [ ]:
for implementation, models in sorted(lp_model_files.items()):
    for model_name, files in models.items():
        lp_results[model_name] = OrderedDict()
        for metric in metrics:
            lp_results[model_name][metric] = {}
        for model_size, model_file in files.items():
            print('Evaluating model %s of size %d' % (model_name, model_size))
            embedding = load_model(implementation, model_file)
            eval_instance = LinkPredictionEvaluation(wordnet_train_file, wordnet_test_file, embedding)
            eval_result = eval_instance.evaluate(max_n=1000)
            for metric in metrics:
                lp_results[model_name][metric][model_size] = eval_result[metric]

In [ ]:
display_results('WordNet Link Prediction', lp_results)

Results from the paper -

These results follow similar trends as the reconstruction results. Repeating here for ease of reading -

  1. The gensim implementation does significantly better for all model sizes and hyperparameters than both the other implementations.
  2. The results from the original paper have not been achieved by our implementation. Especially for models with lower dimensions, the paper mentions significantly better mean rank and MAP for the link prediction task.
  3. Using burn-in and regularization leads to better results with low model sizes, however the results do not improve significantly with increasing model size.

The main difference from the reconstruction results is that mean ranks for link prediction are slightly worse most of the time than the corresponding reconstruction results. This is to be expected, as link prediction is performed on a held-out test set.

4.3 HyperLex Lexical Entailment

The Lexical Entailment task is performed using the HyperLex dataset, a collection of 2163 noun pairs and scores that denote "To what degree is noun A a type of noun Y". For example -

girl person 9.85

These scores are out of 10.

The spearman's correlation score is computed for the predicted and actual similarity scores, with the models trained on the entire WordNet noun hierarchy.


In [ ]:
entailment_results = OrderedDict()
eval_instance = LexicalEntailmentEvaluation(hyperlex_file)

In [ ]:
for implementation, models in sorted(model_files.items()):
    for model_name, files in models.items():
        if model_name in entailment_results:
            continue
        entailment_results[model_name] = OrderedDict()
        entailment_results[model_name]['spearman'] = {}
        for model_size, model_file in files.items():
            print('Evaluating model %s of size %d' % (model_name, model_size))
            embedding = load_model(implementation, model_file)
            entailment_results[model_name]['spearman'][model_size] = eval_instance.evaluate_spearman(embedding)

In [ ]:
display_results('Lexical Entailment (HyperLex)', entailment_results)

Results from paper (for Poincaré Embeddings, as well as other embeddings from previous papers) -

Some observations -

  1. We achieve a max spearman score of 0.48, fairly close to the spearman score of 0.512 mentioned in the paper.
  2. The best results are obtained with 20 negative examples, a batch size of 10, and no burn-in, however the differences are too low to make a meaningful conclusion.

However, there are a few ambiguities and caveats -

  1. The paper does not mention which hyperparameters and model size have been used for the above mentioned result. Hence it is possible that the results are achieved with a significantly lower model size than the one we use, which would imply that our implementation still has some way to go.
  2. The same word can have multiple nodes in the WordNet dataset for different senses of the word, and it is unclear in the paper how to decide which node to pick. For the above results, we have gone with the sane default of picking the particular sense that has the maximum similarity score with the target word.
  3. Certain words in the HyperLex dataset seem to be absent from the WordNet data - the paper does not mention any such thing. Pairs containing missing words have been omitted from the evaluation (182/2163).

The paper also describes a variant of the Poincaré model to learn embeddings of nodes in a symmetric graph, unlike the WordNet noun hierarchy, which is directed and asymmetric. The datasets used in the paper for this model are scientific collaboration networks, in which the nodes are researchers and an edge represents that the two researchers have co-authored a paper.

This variant has not been implemented yet, and is therefore not a part of our experiments.

5. Next Steps

  1. The model can be investigated further to understand why it doesn't produce results as good as the paper. It is possible that this might be due to training details not present in the paper, or due to us incorrectly interpreting some ambiguous parts of the paper. We have not been able to clarify all such ambiguities in communication with the authors.
  2. Optimizing the training process further - with a model size of 50 dimensions and a dataset with ~700k relations and ~80k nodes, the Gensim implementation takes around 45 seconds to complete an epoch (~15k relations per second), whereas the open source C++ implementation takes around 1/6th the time (~95k relations per second).
  3. Implementing the variant of the model mentioned in the paper for symmetric graphs and evaluating on the scientific collaboration datasets described earlier in the report.